Search results for "Radial basis function kernel"
Showing 10 of 24 documents
Semi-Supervised Support Vector Biophysical Parameter Estimation
2008
Two kernel-based methods for semi-supervised regression are presented. The methods rely on building a graph or hypergraph Laplacian with both the labeled and unlabeled data, which is further used to deform the training kernel matrix. The deformed kernel is then used for support vector regression (SVR). The semi-supervised SVR methods are successfully tested on LAI estimation and ocean chlorophyll concentration prediction from remotely sensed images.
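The deformation step described in the abstract can be sketched in a few lines of NumPy. This is a minimal illustration, not the authors' implementation: the bandwidth `gamma` and deformation weight `mu` are hypothetical parameters, and the Gram-matrix form K_def = K(I + muLK)^(-1) follows the standard graph-Laplacian kernel deformation.

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    # Gaussian (RBF) kernel matrix
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def deformed_kernel(X_all, gamma=1.0, mu=0.1):
    """Deform the RBF kernel over labeled + unlabeled samples with the
    graph Laplacian of an RBF affinity graph: K_def = K (I + mu*L*K)^(-1)."""
    K = rbf_kernel(X_all, X_all, gamma)
    W = K                                  # reuse RBF affinities as graph weights
    L = np.diag(W.sum(axis=1)) - W         # unnormalized graph Laplacian
    n = len(X_all)
    return K @ np.linalg.inv(np.eye(n) + mu * L @ K)
```

With `mu = 0` the deformation vanishes and the plain RBF kernel is recovered; increasing `mu` pulls the kernel toward the geometry of the unlabeled data before it is handed to an SVR solver.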
Regularized RBF Networks for Hyperspectral Data Classification
2004
In this paper, we analyze several regularized types of Radial Basis Function (RBF) Networks for crop classification using hyperspectral images. We compare the regularized RBF neural network with Support Vector Machines (SVM) using the RBF kernel, and with the AdaBoost Regularized (ABR) algorithm using RBF bases, in terms of accuracy and robustness. Several scenarios of increasing input space dimensionality are tested for six images containing six crop classes. Attention is also paid to regularization, sparseness, and knowledge extraction.
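A regularized RBF network of the kind compared here can be sketched as ridge-regularized least squares over Gaussian basis functions. A minimal hypothetical version (center placement, `gamma`, and the regularizer `lam` are illustrative choices, not the paper's setup):

```python
import numpy as np

def rbf_design(X, centers, gamma=1.0):
    # Gaussian basis expansion: Phi[i, j] = exp(-gamma * ||x_i - c_j||^2)
    d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fit_regularized_rbf(X, y, centers, gamma=1.0, lam=1e-2):
    # ridge-regularized least squares on the RBF basis
    Phi = rbf_design(X, centers, gamma)
    return np.linalg.solve(Phi.T @ Phi + lam * np.eye(len(centers)), Phi.T @ y)

def predict_rbf(w, X, centers, gamma=1.0):
    return rbf_design(X, centers, gamma) @ w
```

The regularizer `lam` controls the smoothness/sparseness trade-off the abstract alludes to; with `lam` near zero and centers placed at the training points, the network interpolates the data.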
Optimizing Kernel Ridge Regression for Remote Sensing Problems
2018
Kernel methods have been very successful in remote sensing problems because of their ability to deal with high-dimensional nonlinear data. However, they are computationally expensive to train when a large number of samples is used. In this context, while the amount of available remote sensing data has constantly increased, the size of training sets in kernel methods is usually restricted to a few thousand samples. In this work, we modified the kernel ridge regression (KRR) training procedure to deal with large-scale datasets. In addition, the basis functions in the reproducing kernel Hilbert space are defined as parameters to be optimized during the training process as well. This extends the n…
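For reference, the closed-form KRR solution whose cost this training procedure attacks is short. A bare NumPy sketch (bandwidth and ridge parameter are illustrative):

```python
import numpy as np

def rbf_kernel(X, Y, gamma=1.0):
    d2 = ((X[:, None, :] - Y[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def krr_fit(X, y, gamma=1.0, lam=1e-3):
    # dual coefficients: alpha = (K + lam*I)^(-1) y
    K = rbf_kernel(X, X, gamma)
    return np.linalg.solve(K + lam * np.eye(len(X)), y)

def krr_predict(alpha, X_train, X_new, gamma=1.0):
    return rbf_kernel(X_new, X_train, gamma) @ alpha
```

Solving the n-by-n linear system costs O(n^3) time and O(n^2) memory, which is exactly why training sets in plain kernel methods stay at a few thousand samples and why a modified procedure is needed at scale.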
Structured Output SVM for Remote Sensing Image Classification
2011
Traditional kernel classifiers assume independence among the classification outputs. As a consequence, each misclassification receives the same weight in the loss function. Moreover, the kernel function only takes into account the similarity between input values and ignores possible relationships between the classes to be predicted. These assumptions do not hold for most real-life problems, and remote sensing data are no exception. Segmentation of images acquired by airborne or satellite sensors is a very active field of research in which one tries to classify a pixel into a predefined set of classes of interest (e.g. water, grass, trees,…
Optimized Kernel Entropy Components
2016
This work addresses two main issues of the standard Kernel Entropy Component Analysis (KECA) algorithm: the optimization of the kernel decomposition and the optimization of the Gaussian kernel parameter. KECA roughly reduces to sorting kernel eigenvectors by importance in terms of entropy instead of variance, as in Kernel Principal Component Analysis. In this work, we propose an extension of the KECA method, named Optimized KECA (OKECA), that directly extracts the optimal features retaining most of the data entropy by compacting the information into very few features (often just one or two). The proposed method produces features with higher expressive power. In particular…
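The entropy-based sorting that KECA substitutes for variance-based sorting is compact enough to sketch: each kernel eigenpair (lambda_i, e_i) is scored by its contribution lambda_i * (1^T e_i)^2 to the Renyi entropy estimate. A minimal illustration of the baseline KECA ranking (not the OKECA optimization itself):

```python
import numpy as np

def keca_order(K):
    """Rank kernel eigen-directions by entropy contribution
    lambda_i * (1^T e_i)^2 (KECA), instead of by lambda_i alone (KPCA)."""
    lam, E = np.linalg.eigh(K)                 # eigenvalues in ascending order
    contrib = lam * (E.sum(axis=0) ** 2)       # entropy contribution per direction
    return np.argsort(contrib)[::-1], contrib  # indices, best first
```

Note that a direction with a large eigenvalue but an eigenvector nearly orthogonal to the all-ones vector contributes little entropy, so the KECA ranking can differ substantially from the KPCA one.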
Semisupervised nonlinear feature extraction for image classification
2012
Feature extraction is of paramount importance for an accurate classification of remote sensing images. Techniques based on data transformations are widely used in this context. However, linear feature extraction algorithms, such as the principal component analysis and partial least squares, can address this problem in a suboptimal way because the data relations are often nonlinear. Kernel methods may alleviate this problem only when the structure of the data manifold is properly captured. However, this is difficult to achieve when small-size training sets are available. In these cases, exploiting the information contained in unlabeled samples together with the available training data can si…
Kernel-Based Inference of Functions Over Graphs
2018
The study of networks has witnessed explosive growth over the past decades, with several ground-breaking methods introduced. A particularly interesting problem, prevalent in several fields of study, is that of inferring a function defined over the nodes of a network. This work presents a versatile kernel-based framework for tackling this inference problem that naturally subsumes and generalizes the reconstruction approaches recently put forth by the graph signal processing community. Both the static and the dynamic settings are considered, along with effective modeling approaches for addressing real-world problems. The analytical discussion herein is complement…
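The basic static setting can be sketched directly: observe a function on a subset of nodes and estimate it on all nodes via kernel ridge regression with a graph kernel. Here the regularized-Laplacian kernel K = (L + eps*I)^(-1) and all parameters are illustrative assumptions, not the paper's full framework.

```python
import numpy as np

def regularized_laplacian_kernel(A, eps=0.1):
    # K = (L + eps*I)^(-1), a standard positive-definite graph kernel
    L = np.diag(A.sum(axis=1)) - A
    return np.linalg.inv(L + eps * np.eye(len(A)))

def infer_graph_function(A, obs_idx, y_obs, eps=0.1, lam=1e-2):
    # kernel ridge regression on observed nodes, extended to all nodes
    K = regularized_laplacian_kernel(A, eps)
    Ko = K[np.ix_(obs_idx, obs_idx)]
    alpha = np.linalg.solve(Ko + lam * np.eye(len(obs_idx)), y_obs)
    return K[:, obs_idx] @ alpha
```

The graph kernel encodes smoothness over edges, so values at unobserved nodes are interpolated from their observed neighbors rather than predicted independently.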
Model selection based product kernel learning for regression on graphs
2013
The choice of a suitable graph kernel is intrinsically hard and often cannot be made in an informed manner for a given dataset. Methods for multiple kernel learning offer a possible remedy, as they combine and weight kernels on the basis of a labeled training set of molecules to define a new kernel. Whereas most methods for multiple kernel learning focus on learning convex linear combinations of kernels, we propose to combine kernels in products, which theoretically enables higher expressiveness. In experiments on ten publicly available chemical QSAR datasets, we show that product kernel learning is not significantly worse than any of the competing kernel methods on any dataset, and on average the…
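The multiplicative combination itself is easy to state: base kernel matrices are combined elementwise as K = prod_i K_i^(w_i). For Gaussian base kernels this remains a Gaussian kernel (the bandwidths add); for arbitrary base kernels, positive-definiteness of weighted elementwise powers is not guaranteed in general. A hedged sketch with illustrative weights:

```python
import numpy as np

def rbf(X, gamma):
    d2 = ((X[:, None, :] - X[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def product_kernel(kernels, weights):
    # elementwise weighted product of base kernel matrices:
    # K = prod_i K_i ** w_i
    K = np.ones_like(kernels[0])
    for Ki, wi in zip(kernels, weights):
        K = K * Ki ** wi
    return K
```

For two RBF kernels with bandwidths gamma1 and gamma2 and unit weights, the product equals a single RBF kernel with bandwidth gamma1 + gamma2, which is the sense in which products can act more sharply than convex sums.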
A structural cluster kernel for learning on graphs
2012
In recent years, graph kernels have received considerable interest within the machine learning and data mining community. Here, we introduce a novel approach enabling kernel methods to utilize additional information hidden in the structural neighborhood of the graphs under consideration. Our novel structural cluster kernel (SCK) incorporates similarities induced by a structural clustering algorithm to improve state-of-the-art graph kernels. The approach taken is based on the idea that graph similarity can not only be described by the similarity between the graphs themselves, but also by the similarity they possess with respect to their structural neighborhood. We applied our novel kernel in…
A Novel System for Multi-level Crohn’s Disease Classification and Grading Based on a Multiclass Support Vector Machine
2020
Crohn’s disease (CD) is a chronic inflammatory condition of the gastrointestinal tract that can severely affect a patient’s quality of life. Diagnostic imaging, such as Enterography Magnetic Resonance Imaging (E-MRI), provides crucial information for CD activity assessment. Automatic learning methods play a fundamental role in the classification of CD and make it possible to avoid the long and expensive manual classification process performed by radiologists. This paper presents a novel classification method that uses a multiclass Support Vector Machine (SVM) based on a Radial Basis Function (RBF) kernel for the grading of CD inflammatory activity. To validate the system, we have used a dataset composed of 800 E-MRI…
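A full quadratic-programming SVM solver is beyond a short sketch, so the following stand-in uses a one-vs-rest kernel perceptron: it shares the dual representation and RBF kernel of a multiclass RBF-kernel SVM but swaps in a simpler training rule. All names and parameters here are illustrative, not the paper's system.

```python
import numpy as np

def rbf(X, Y, gamma=1.0):
    d2 = ((X[:, None] - Y[None, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def train_ovr_kernel_perceptron(X, y, n_classes, gamma=1.0, epochs=10):
    # one binary kernel perceptron per class (one-vs-rest)
    K = rbf(X, X, gamma)
    A = np.zeros((n_classes, len(X)))          # dual coefficients per class
    for c in range(n_classes):
        t = np.where(y == c, 1.0, -1.0)        # +1 for class c, -1 otherwise
        for _ in range(epochs):
            for i in range(len(X)):
                if np.sign(A[c] @ K[:, i]) != t[i]:
                    A[c, i] += t[i]            # perceptron mistake update
    return A

def predict(A, X_train, X_new, gamma=1.0):
    # label = class whose decision function scores highest
    return (A @ rbf(X_train, X_new, gamma)).argmax(axis=0)
```

As in the paper's SVM, each sample contributes through kernel evaluations only, so the RBF bandwidth `gamma` plays the same role it would in the multiclass SVM grading system.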